Search for: All records

Creators/Authors contains: "Leok, Melvin"

Note: Clicking a Digital Object Identifier (DOI) link takes you to an external site maintained by the publisher. Some full-text articles may not be freely available during the publisher's embargo period.

Some links on this page may lead to non-federal websites, whose policies may differ from this site's.

  1. Free, publicly-accessible full text available June 30, 2026
  2. Motivated by recent developments in Hamiltonian variational principles, Hamiltonian variational integrators, and their applications such as to optimization and control, we present a new Type II variational approach for Hamiltonian systems, based on a virtual work principle that enforces the Type II boundary conditions through a combination of essential and natural boundary conditions; in particular, this approach allows the variational principle to be defined intrinsically on manifolds. We first develop this variational principle on vector spaces and subsequently extend it to parallelizable manifolds, general manifolds, as well as to the infinite-dimensional setting. Furthermore, we provide a review of variational principles for Hamiltonian systems in various settings as well as their applications. 
    Free, publicly-accessible full text available March 1, 2026
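As context for the abstract above, a common vector-space statement of a Type II variational principle (a simplified sketch; the paper's intrinsic, manifold-valued formulation is more general) is:

```latex
\delta \left[ p(T)\, q(T) \;-\; \int_0^T \big( p\,\dot{q} - H(q, p) \big)\, dt \right] = 0,
\qquad \delta q(0) = 0, \quad \delta p(T) = 0,
```

whose stationarity conditions are Hamilton's equations $\dot{q} = \partial H/\partial p$, $\dot{p} = -\partial H/\partial q$, with the position fixed at the initial time and the momentum fixed at the final time (the Type II boundary conditions).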
  3. Free, publicly-accessible full text available March 1, 2026
  4. Geometric numerical integration has recently been exploited to design symplectic accelerated optimization algorithms by simulating the Bregman Lagrangian and Hamiltonian systems from the variational framework introduced by Wibisono et al. In this paper, we discuss practical considerations which can significantly boost the computational performance of these optimization algorithms and considerably simplify the tuning process. In particular, we investigate how momentum restarting schemes ameliorate computational efficiency and robustness by reducing the undesirable effect of oscillations and ease the tuning process by making time-adaptivity superfluous. We also discuss how temporal looping helps avoid instability issues caused by finite numerical precision, without harming the computational efficiency of the algorithms. Finally, we compare the efficiency and robustness of different geometric integration techniques and study the effects of the different parameters in the algorithms to inform and simplify tuning in practice. From this paper emerge symplectic accelerated optimization algorithms whose computational efficiency, stability and robustness have been improved, and which are now much simpler to use and tune for practical applications. 
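The momentum-restarting idea mentioned in the abstract above can be illustrated with a minimal sketch (this is a generic gradient-based restart scheme, not the authors' actual algorithm; the function names and the test problem are illustrative assumptions):

```python
import numpy as np

def nag_with_restart(grad, x0, step=0.05, iters=500):
    """Nesterov-style momentum descent with a gradient-based restart:
    the momentum is zeroed whenever it points uphill (against the
    negative gradient), damping the oscillations the paper discusses."""
    x = np.asarray(x0, dtype=float).copy()
    v = np.zeros_like(x)
    k = 0
    for _ in range(iters):
        g = grad(x)
        if np.dot(v, g) > 0:      # velocity opposes descent: restart
            v[:] = 0.0
            k = 0
        k += 1
        mu = (k - 1) / (k + 2)    # standard Nesterov momentum schedule
        v = mu * v - step * g
        x = x + v
    return x

# Ill-conditioned quadratic test problem f(x) = 0.5 * x^T diag(1, 10) x
x_star = nag_with_restart(lambda x: np.array([1.0, 10.0]) * x, [5.0, 5.0])
```

Without the restart test, the fast eigendirection oscillates as the momentum coefficient grows; zeroing the velocity on an uphill move keeps the iteration monotone in practice.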
  5. Abstract Adjoint systems are widely used to inform control, optimization, and design in systems described by ordinary differential equations or differential-algebraic equations. In this paper, we explore the geometric properties and develop methods for such adjoint systems. In particular, we utilize symplectic and presymplectic geometry to investigate the properties of adjoint systems associated with ordinary differential equations and differential-algebraic equations, respectively. We show that the adjoint variational quadratic conservation laws, which are key to adjoint sensitivity analysis, arise from (pre)symplecticity of such adjoint systems. We discuss various additional geometric properties of adjoint systems, such as symmetries and variational characterizations. For adjoint systems associated with a differential-algebraic equation, we relate the index of the differential-algebraic equation to the presymplectic constraint algorithm of Gotay et al. (J Math Phys 19(11):2388–2399, 1978). As an application of this geometric framework, we discuss how the adjoint variational quadratic conservation laws can be used to compute sensitivities of terminal or running cost functions. Furthermore, we develop structure-preserving numerical methods for such systems using Galerkin Hamiltonian variational integrators (Leok and Zhang in IMA J. Numer. Anal. 31(4):1497–1532, 2011) which admit discrete analogues of these quadratic conservation laws. We additionally show that such methods are natural, in the sense that reduction, forming the adjoint system, and discretization all commute, for suitable choices of these processes. We utilize this naturality to derive a variational error analysis result for the presymplectic variational integrator that we use to discretize the adjoint DAE system. Finally, we discuss the application of adjoint systems in the context of optimal control problems, where we prove a similar naturality result. 
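The adjoint variational quadratic conservation law highlighted in the abstract above can be checked numerically in a small sketch (a generic linear example with one simple choice of discrete adjoint step, not the paper's Galerkin Hamiltonian variational integrators; the matrix and vectors are illustrative assumptions):

```python
import numpy as np

# Linear test system  x' = A x,  variational equation  (dx)' = A dx,
# adjoint equation  a' = -A^T a.  The pairing <a, dx> is conserved.
A = np.array([[0.0, 1.0], [-4.0, -0.1]])
h, steps = 0.01, 200

dx = np.array([1.0, 0.0])    # tangent (variational) vector
a = np.array([0.3, -0.7])    # adjoint (costate) vector
pairing0 = a @ dx

Phi = np.eye(2) + h * A      # forward Euler step map for dx
Psi = np.linalg.inv(Phi).T   # matching discrete adjoint step
for _ in range(steps):
    dx = Phi @ dx
    a = Psi @ a

# Psi^T Phi = I exactly, so <a, dx> is preserved to machine precision
drift = abs(a @ dx - pairing0)
```

Choosing the adjoint step as the inverse transpose of the forward step map is one simple way to obtain a discrete analogue of the quadratic conservation law; the paper's structure-preserving integrators achieve this within a variational framework.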
  6. Incorporating prior knowledge of physics laws and structural properties of dynamical systems into the design of deep learning architectures has proven to be a powerful technique for improving their computational efficiency and generalization capacity. Learning accurate models of robot dynamics is critical for safe and stable control. Autonomous mobile robots, including wheeled, aerial, and underwater vehicles, can be modeled as controlled Lagrangian or Hamiltonian rigid-body systems evolving on matrix Lie groups. In this paper, we introduce a new structure-preserving deep learning architecture, the Lie group Forced Variational Integrator Network (LieFVIN), capable of learning controlled Lagrangian or Hamiltonian dynamics on Lie groups, either from position-velocity or position-only data. By design, LieFVINs preserve both the Lie group structure on which the dynamics evolve and the symplectic structure underlying the Hamiltonian or Lagrangian systems of interest. The proposed architecture learns surrogate discrete-time flow maps allowing accurate and fast prediction without the numerical-integrator, neural-ODE, or adjoint techniques that are needed when learning vector fields instead. Furthermore, the learnt discrete-time dynamics can be utilized with computationally scalable discrete-time (optimal) control strategies. 
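The Lie group structure preservation described in the abstract above can be sketched for SO(3) (a generic exponential-map update, not the LieFVIN architecture itself; the body velocity used below is an arbitrary illustrative choice):

```python
import numpy as np

def hat(w):
    """Map a vector in R^3 to a 3x3 skew-symmetric matrix in so(3)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def expm_so3(w):
    """Rodrigues' formula: exponential map from so(3) to SO(3)."""
    th = np.linalg.norm(w)
    W = hat(w)
    if th < 1e-12:
        return np.eye(3) + W
    return (np.eye(3) + np.sin(th) / th * W
            + (1.0 - np.cos(th)) / th**2 * (W @ W))

# A discrete flow map R_{k+1} = R_k exp(h * hat(omega_k)) stays on SO(3)
# by construction -- the structural property the architecture preserves.
R, h = np.eye(3), 0.1
for k in range(100):
    omega = np.array([1.0, 0.5 * np.sin(0.1 * k), -0.2])
    R = R @ expm_so3(h * omega)

orthogonality_error = np.linalg.norm(R.T @ R - np.eye(3))
```

Because each factor is an exact rotation, the product remains orthogonal to machine precision, with no drift off the group to project away.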
  7. A variational framework for accelerated optimization was recently introduced on normed vector spaces and Riemannian manifolds in [1] and [2]. It was observed that a careful combination of time-adaptivity and symplecticity in the numerical integration can result in a significant gain in computational efficiency. It is however well known that symplectic integrators lose their near-energy preservation properties when variable time-steps are used. The most common approach to circumvent this problem involves the Poincaré transformation on the Hamiltonian side, and was used in [3] to construct efficient explicit algorithms for symplectic accelerated optimization. However, the current formulations of Hamiltonian variational integrators do not make intrinsic sense on more general spaces such as Riemannian manifolds and Lie groups. In contrast, Lagrangian variational integrators are well-defined on manifolds, so we develop here a framework for time-adaptivity in Lagrangian variational integrators and use the resulting geometric integrators to solve optimization problems on vector spaces and Lie groups. 
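For reference, the Poincaré transformation mentioned in the abstract above is commonly written as follows (a generic form with generic symbols, not necessarily the papers' notation): time $t$ is adjoined as an extra coordinate $q_t$ with conjugate momentum $p_t$, and the extended Hamiltonian

```latex
\bar{H}(q, q_t, p, p_t) \;=\; g(q, p)\,\big( H(q, p) + p_t \big)
```

is integrated with fixed steps in a fictive time $\tau$. Since $\dot{q}_t = \partial \bar{H}/\partial p_t = g$, fixed steps $\Delta\tau$ correspond to variable real-time steps $\Delta t \approx g\,\Delta\tau$, so the integrator itself remains fixed-step and symplectic while the original system is sampled adaptively.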
  8. Riemannian submanifold optimization with momentum is computationally challenging because, to ensure that the iterates remain on the submanifold, we often need to solve difficult differential equations. Here, we simplify such difficulties for a class of structured symmetric positive-definite matrices with the affine-invariant metric. We do so by proposing a generalized version of the Riemannian normal coordinates that dynamically orthonormalizes the metric and locally converts the problem into an unconstrained problem in the Euclidean space. We use our approach to simplify existing approaches for structured covariances and develop matrix-inverse-free 2nd-order optimizers for deep learning in low precision settings. 
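The idea in the abstract above, converting an SPD-manifold step into an unconstrained symmetric-matrix step via coordinates that orthonormalize the affine-invariant metric, can be sketched as follows (a minimal illustration in the spirit of the abstract, not the authors' generalized normal coordinates; the matrices are illustrative assumptions):

```python
import numpy as np

def expm_sym(M):
    """Matrix exponential of a symmetric matrix via eigendecomposition."""
    w, Q = np.linalg.eigh(M)
    return (Q * np.exp(w)) @ Q.T

def normal_coord_update(X, M):
    """Move from an SPD matrix X in the direction of the symmetric
    'local coordinate' M, staying exactly on the SPD manifold.
    With A A^T = X (Cholesky), the map X -> A expm(M) A^T uses a frame
    that orthonormalizes the affine-invariant metric at X, so M lives
    in an unconstrained Euclidean space of symmetric matrices."""
    A = np.linalg.cholesky(X)
    return A @ expm_sym(M) @ A.T

X = np.array([[2.0, 0.3], [0.3, 1.0]])    # an SPD starting point
M = np.array([[-0.5, 0.1], [0.1, 0.2]])   # symmetric local step
Xn = normal_coord_update(X, M)
min_eig = np.linalg.eigvalsh(Xn).min()    # positive => still SPD
```

The update never needs a projection or a differential-equation solve: any symmetric $M$, however large, maps to an SPD matrix, which is what makes momentum updates in the local coordinates unconstrained.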